Training Convolutional Neural Networks with Competitive Hebbian Learning Approaches
Authors
Abstract
We explore competitive Hebbian learning strategies to train feature detectors in Convolutional Neural Networks (CNNs) without supervision. We consider variants of the Winner-Takes-All (WTA) strategy explored in previous works, i.e. k-WTA, e-soft-WTA and p-soft-WTA, performing experiments on different object recognition datasets. Results suggest that these approaches are effective for training early feature extraction layers, or for re-training the higher layers of a pre-trained network, with soft competition generally performing better than the other strategies considered in this work. Our findings encourage a path of cooperation between neuroscience and computer science towards a deeper investigation of biologically inspired learning principles.
Keywords: Neural networks, Machine learning, Hebbian learning, Competitive learning, Computer vision, Biologically
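To make the competitive Hebbian idea concrete, here is a minimal sketch, not the authors' exact rule: it assumes a PyTorch-style convolutional filter bank and applies a hard k-WTA mask with a "winners move toward their input patches" update; the paper's soft variants (e-soft-WTA, p-soft-WTA) would replace the hard mask with a softmax-style weighting. The function name, shapes and hyperparameters are placeholders, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def kwta_hebbian_step(weights, x, k=1, lr=0.01):
    """One hard k-WTA competitive Hebbian update for a convolutional filter bank.

    weights: (out_channels, in_channels, kH, kW) filters
    x:       (batch, in_channels, H, W) images or feature maps
    At each spatial location only the k most active filters are updated,
    pulling their weights toward the patch that activated them:
    delta_w_i = lr * sum_j y_ij * (patch_j - w_i)
    """
    out_c, in_c, kh, kw = weights.shape
    y = F.conv2d(x, weights)                      # filter responses
    # Hard k-WTA: zero every response below the k-th strongest per location
    kth = y.topk(k, dim=1).values[:, -1:, :, :]
    y = y * (y >= kth).float()
    # Flatten responses and the corresponding input patches
    y = y.flatten(2)                              # (batch, out_c, L)
    patches = F.unfold(x, (kh, kw))               # (batch, in_c*kh*kw, L)
    # Hebbian attraction term and the matching weight-decay term
    hebb = torch.einsum('bol,bpl->op', y, patches)
    activity = y.sum(dim=(0, 2)).unsqueeze(1)
    delta = lr * (hebb - activity * weights.flatten(1))
    return (weights.flatten(1) + delta).view_as(weights)

# Toy usage: learn 8 filters of size 5x5 from random RGB inputs
w = 0.1 * torch.randn(8, 3, 5, 5)
w = kwta_hebbian_step(w, torch.randn(16, 3, 32, 32), k=2)
```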
Similar Resources
Modular neural networks with Hebbian learning rule
The paper consists of two parts, each describing a learning neural network with the same modular architecture and a similar set of functioning algorithms. Both networks are artificially partitioned into several equal modules according to the number of classes the network has to recognize. The Hebbian learning rule is used for network training. In the first network, the learning proces...
Hebbian Learning in Large Recurrent Neural Networks
This paper presents the guidelines of an ongoing project of the "Movement Dynamics" team in the "Movement and Perception" Lab, UMR6152, Marseille. We address the question of Hebbian learning in large recurrent networks. The aim of this research is to present new functional models of learning, through the use of well-known methods in a context of high non-linearity and intricate neuronal dynamics.
Learning Document Image Features With SqueezeNet Convolutional Neural Network
The classification of various document images is considered an important step towards building a modern digital library or office automation system. Convolutional Neural Network (CNN) classifiers trained with backpropagation are considered to be the current state-of-the-art models for this task. However, there are two major drawbacks of these classifiers: the huge computational power demand for...
Towards dropout training for convolutional neural networks
Recently, dropout has seen increasing use in deep learning. For deep convolutional neural networks, dropout is known to work well in fully-connected layers. However, its effect in convolutional and pooling layers is still not clear. This paper demonstrates that max-pooling dropout is equivalent to randomly picking activations based on a multinomial distribution at training time. In light of this...
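The "drop units, then max-pool" formulation summarized above can be illustrated in a few lines. The sketch below is a hypothetical PyTorch snippet (the function name and the 2x2 pooling window are our choices, not from the cited paper); taking the max over the surviving units is the operation that the cited paper analyzes as multinomial sampling over each pooling region.

```python
import torch
import torch.nn.functional as F

def max_pooling_dropout(x, p=0.5):
    # Training-time sketch: zero each activation with probability p,
    # then take the max over each 2x2 pooling region.
    mask = (torch.rand_like(x) > p).float()
    return F.max_pool2d(x * mask, kernel_size=2)

feats = torch.relu(torch.randn(4, 16, 8, 8))   # toy post-ReLU feature maps
pooled = max_pooling_dropout(feats, p=0.3)     # shape (4, 16, 4, 4)
```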
Semi-Supervised Training of Convolutional Neural Networks
In this paper we discuss a method for semi-supervised training of CNNs. By using auto-encoders to extract features from unlabeled images, we can train CNNs to accurately classify images with only a small set of labeled images. We show our method’s results on a shallow CNN using the CIFAR-10 dataset, and some preliminary results on a VGG-16 network using the STL-10 dataset.
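As a rough illustration of the semi-supervised recipe described above (unsupervised auto-encoder features plus a small labeled set), here is a hypothetical PyTorch sketch; the layer sizes, `unlabeled_loader`, and `labeled_loader` are placeholders rather than details from the cited paper.

```python
import torch
import torch.nn as nn

# Hypothetical setup: 3x32x32 images (CIFAR-10-like), 10 classes.
encoder = nn.Sequential(
    nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
)
decoder = nn.Sequential(
    nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
    nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),
)

# 1) Unsupervised stage: reconstruct unlabeled images.
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
for x in unlabeled_loader:            # hypothetical DataLoader of images only
    loss = nn.functional.mse_loss(decoder(encoder(x)), x)
    opt.zero_grad(); loss.backward(); opt.step()

# 2) Supervised stage: a small labeled set trains only a linear head
#    on top of the frozen unsupervised features.
head = nn.Sequential(nn.Flatten(), nn.Linear(64 * 8 * 8, 10))
opt = torch.optim.Adam(head.parameters(), lr=1e-3)
for x, y in labeled_loader:           # hypothetical small labeled DataLoader
    with torch.no_grad():
        feats = encoder(x)
    loss = nn.functional.cross_entropy(head(feats), y)
    opt.zero_grad(); loss.backward(); opt.step()
```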
Journal
Journal title: Lecture Notes in Computer Science
Year: 2022
ISSN: 1611-3349, 0302-9743
DOI: https://doi.org/10.1007/978-3-030-95467-3_2